Movement + Processing + SuperCollider

Early this year (2012) I was involved in a creative development. One of the outcomes of this project was a real-time, movement-based granular synthesis setup: output from a webcam was fed into Processing, which ran a blob detection algorithm on the video feed and sent the results via Open Sound Control to SuperCollider, which ran a granular synth.
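The framework itself isn't shown here, but to make the pipeline concrete, here's a minimal sketch of what the Processing side can look like. It uses the oscP5/netP5 libraries for OSC; a crude brightness-threshold centroid stands in for a proper blob detection library, and the `/blob` address and port 57120 (sclang's default) are assumptions for illustration, not the project's actual setup.

```processing
// Minimal sketch: track the centroid of bright pixels in the webcam
// feed and send its normalised position to SuperCollider over OSC.
// Assumes the oscP5/netP5 libraries; the brightness threshold is a
// crude stand-in for a real blob detection library.
import processing.video.*;
import oscP5.*;
import netP5.*;

Capture cam;
OscP5 osc;
NetAddress sc;

void setup() {
  size(320, 240);
  cam = new Capture(this, width, height);
  cam.start();
  osc = new OscP5(this, 12000);              // local listening port
  sc = new NetAddress("127.0.0.1", 57120);   // sclang's default port
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  image(cam, 0, 0);
  cam.loadPixels();

  // Centroid of all pixels brighter than the threshold -- our "blob".
  float sumX = 0, sumY = 0;
  int count = 0;
  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      if (brightness(cam.pixels[y * width + x]) > 200) {
        sumX += x;
        sumY += y;
        count++;
      }
    }
  }
  if (count > 0) {
    OscMessage msg = new OscMessage("/blob");  // assumed OSC address
    msg.add(sumX / count / width);             // x, normalised 0..1
    msg.add(sumY / count / height);            // y, normalised 0..1
    osc.send(msg, sc);
  }
}
```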

The goal was simple: map a performer's movement in space to control sound. Over 2 days, I developed a small framework to explore some of the possibilities. Another 4 days were then spent refactoring the framework as the creative team discovered and explored what the setup could do.
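On the SuperCollider side, a movement-to-sound mapping along these lines could look like the sketch below. It is illustrative rather than the project's actual mapping: the `GrainBuf`-based synth, the bundled `a11wlk01.wav` sound file, and the choice to map blob x to grain position and blob y to grain density are all assumptions.

```supercollider
// Minimal sketch of the receiving side: one granular synth whose
// grain position and density are driven by incoming /blob messages.
// The sound file, OSC address, and parameter ranges are illustrative.
s.waitForBoot {
    b = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");

    SynthDef(\grains, { |buf, pos = 0.5, density = 10, dur = 0.1, amp = 0.2|
        var trig = Impulse.kr(density);
        var sig = GrainBuf.ar(2, trig, dur, buf, 1, pos);
        Out.ar(0, sig * amp);
    }).add;

    s.sync;  // wait until the SynthDef is on the server
    x = Synth(\grains, [\buf, b]);

    // msg is [address, x, y]; map x to grain position in the buffer
    // and y to grain density (exponentially, 2..40 grains per second).
    OSCdef(\blob, { |msg|
        x.set(
            \pos, msg[1],
            \density, msg[2].linexp(0, 1, 2, 40)
        );
    }, '/blob');
};
```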

I had used SuperCollider a gazillion times before for sound design projects. But this time I threw myself into using more code and more tech to generate the sound design content, and I really enjoyed the process. The coming together of the two disciplines felt natural and suited both my head and my aesthetics.

It was also a great opportunity to combine creativity and coding.

Creative coding: I hope to do more.

Thanks to Rinske Ginsberg for asking me to work on the project and to Dan Witton for permission to use this raw video.